About Us
At the Shared Reality Lab, we apply our engineering skills and human-centered design experience to tackle a variety of problems of importance to society. We work with audio, video, and haptic technologies, mixed reality, machine learning, and mobile computing, building systems that facilitate and enrich both human-computer and computer-mediated human-human interaction. Active projects include audio and audio-haptic rendering of graphics for blind users, development of conversational avatars for healthcare applications, design of the flight deck of the future, and multimodal delivery of information to users in high-consequence environments. For questions about the lab, please contact Prof. Jeremy Cooperstock.
The Shared Reality Lab is currently funded by grants and contracts from the Natural Sciences and Engineering Research Council, Healthy Brains, Healthy Lives, CRIAQ, and Humanware. Past funding sources include the Fonds Nature et technologies, Sécurité publique Québec, the Canadian Internet Registration Authority, Networks of Centres of Excellence, the Ministère du Développement économique, de l'Innovation et de l'Exportation, the Secrétariat du Conseil du trésor, CANARIE, and Innovation, Science and Economic Development Canada, as well as industrial support from HP Labs, Google Research, Mozilla, InterDigital Corporation, Haply Robotics, and iMD Research.